Homogeneous Systems of Linear Constant-Coefficient Differential Equations
Introduction
A system of linear constant-coefficient differential equations can be written in matrix form as
$$\dot{\mathbf{x}} = A\mathbf{x} + \mathbf{f}(t),$$
where $\mathbf{x}(t)$ is an unknown vector-valued function, $A$ is a constant $n \times n$ coefficient matrix, and $\mathbf{f}(t)$ is the forcing term. When $\mathbf{f}(t) = \mathbf{0}$, we have a homogeneous system. In the homogeneous case, $\mathbf{x}(t) = \mathbf{0}$ is always a solution and the set of all solutions forms an $n$-dimensional vector space.
As in the one-dimensional case, linearity leads to superposition, and specifying an initial condition determines a unique solution.
A key feature of the system is that all solution behavior is encoded in the matrix $A$. The eigenvalues of $A$ determine the behavior of the system, and the eigenvectors (or generalized eigenvectors) determine the directions in which these behaviors occur.
Motivation: One-Dimensional Constant-Coefficient Linear ODEs
Recall that for a one-dimensional constant-coefficient linear ODE, we typically seek an exponential trial solution of the form $x(t) = e^{rt}$. For example, for the second-order homogeneous equation
$$\ddot{x} + b\dot{x} + cx = 0,$$
trying $x = e^{rt}$ leads to the characteristic equation
$$r^2 + br + c = 0,$$
which produces three standard cases:
- **Distinct real roots $r_1 \neq r_2$:** $x = c_1 e^{r_1 t} + c_2 e^{r_2 t}$.
- **Repeated real root $r$:** $x = (c_1 + c_2 t)e^{rt}$.
- **Complex roots $r = a \pm bi$:** $x = e^{at}(c_1 \cos bt + c_2 \sin bt)$.
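As a numerical sanity check (not part of the original notes; the coefficients $b = -3$, $c = 2$ are an arbitrary illustrative choice), we can compute the characteristic roots and substitute the distinct-real-roots general solution back into the ODE:

```python
# Verify the distinct-real-roots case x'' + b x' + c x = 0
# with trial solution x = e^{rt}.  Coefficients are an assumed example.
import numpy as np

b, c = -3.0, 2.0                      # x'' - 3x' + 2x = 0, roots r = 1, 2
r1, r2 = np.roots([1.0, b, c])        # roots of r^2 + b r + c = 0

c1, c2 = 2.0, -1.0                    # arbitrary constants in the general solution
for t in np.linspace(0.0, 1.0, 5):
    x   = c1 * np.exp(r1 * t) + c2 * np.exp(r2 * t)
    xp  = c1 * r1 * np.exp(r1 * t) + c2 * r2 * np.exp(r2 * t)
    xpp = c1 * r1**2 * np.exp(r1 * t) + c2 * r2**2 * np.exp(r2 * t)
    assert abs(xpp + b * xp + c * x) < 1e-9   # the ODE is satisfied
```

Any choice of $c_1, c_2$ passes the check, reflecting that the two exponentials span the full solution space.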
Generalization to $n$ Dimensions: Eigenvalues and Eigenvectors
For the homogeneous system
$$\dot{\mathbf{x}} = A\mathbf{x},$$
we use the analogous exponential ansatz
$$\mathbf{x}(t) = e^{\lambda t}\mathbf{v},$$
where $\mathbf{v}$ is a nonzero constant vector. Differentiating gives
$$\dot{\mathbf{x}} = \lambda e^{\lambda t}\mathbf{v}.$$
Substituting into $\dot{\mathbf{x}} = A\mathbf{x}$ and cancelling the always nonzero factor $e^{\lambda t}$ yields
$$A\mathbf{v} = \lambda\mathbf{v}, \qquad \text{i.e.} \qquad (A - \lambda I)\mathbf{v} = \mathbf{0},$$
so nontrivial solutions $\mathbf{v} \neq \mathbf{0}$ exist only when
$$\det(A - \lambda I) = 0.$$
You may recall that this is precisely the characteristic equation of the matrix $A$. Solving the equation for $\lambda$ retrieves the eigenvalues of $A$.
For each eigenvalue-eigenvector pair $(\lambda, \mathbf{v})$, we obtain a solution $\mathbf{x}(t) = e^{\lambda t}\mathbf{v}$. The eigenvector specifies the direction in space along which the exponential behavior occurs.
As with scalar equations, there are three important cases for systems. In each case, the solution is built from eigenvalues and (generalized) eigenvectors, in direct analogy with characteristic roots and repeated roots in the one-dimensional setting.
Distinct Real Eigenvalues
Assume $A$ has $n$ distinct real eigenvalues $\lambda_1, \dots, \lambda_n$ with corresponding eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$. Then the vector functions
$$\mathbf{x}_i(t) = e^{\lambda_i t}\mathbf{v}_i, \qquad i = 1, \dots, n,$$
are linearly independent solutions of $\dot{\mathbf{x}} = A\mathbf{x}$. The general solution is the linear combination
$$\mathbf{x}(t) = c_1 e^{\lambda_1 t}\mathbf{v}_1 + \cdots + c_n e^{\lambda_n t}\mathbf{v}_n,$$
where the constants $c_1, \dots, c_n$ are determined by an initial condition $\mathbf{x}(0) = \mathbf{x}_0$.
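This recipe can be sketched numerically; the matrix `A` and initial condition `x0` below are illustrative assumptions, not from the notes:

```python
# Build the general solution of x' = A x from an eigendecomposition
# and fit the constants to x(0) = x0.  A and x0 are assumed examples.
import numpy as np

A  = np.array([[1.0, 1.0],
               [0.0, 2.0]])
x0 = np.array([2.0, 1.0])

lam, V = np.linalg.eig(A)       # columns of V are the eigenvectors
c = np.linalg.solve(V, x0)      # x(0) = V c determines the constants

def x(t):
    # x(t) = sum_i c_i e^{lam_i t} v_i
    return V @ (c * np.exp(lam * t))

# Check that x(t) satisfies x' = A x (derivative computed analytically).
for t in np.linspace(0.0, 1.0, 5):
    dx = V @ (c * lam * np.exp(lam * t))
    assert np.allclose(dx, A @ x(t))
```

Solving $V\mathbf{c} = \mathbf{x}_0$ works here because distinct eigenvalues guarantee the eigenvectors are linearly independent, so $V$ is invertible.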
Repeated Eigenvalues (Generalized Eigenvectors)
Suppose $\lambda$ is an eigenvalue of $A$ with multiplicity $k$. There are two cases for the eigenvectors:
- **$k$ eigenvectors:** we get $k$ linearly independent eigenvectors.
- **Fewer than $k$ eigenvectors:** we get only $m$ linearly independent eigenvectors, where $m < k$.
Recall that we will need $n$ linearly independent eigenvectors to create a basis. We first count how many eigenvectors we have, and then use generalized eigenvectors to make up the difference.
To find eigenvectors for $\lambda$, recall that we solve the homogeneous linear system
$$(A - \lambda I)\mathbf{v} = \mathbf{0}.$$
Row-reducing $A - \lambda I$ to RREF can let us count the number of eigenvectors relatively quickly: if the RREF has $p$ pivots, then there are $n - p$ free variables, so we get $n - p$ linearly independent eigenvectors for $\lambda$.
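The pivot count can be computed as a matrix rank; a small sketch (the two sample matrices are illustrative assumptions):

```python
# Count linearly independent eigenvectors for an eigenvalue lam as
# n - rank(A - lam I): the number of free variables in (A - lam I)v = 0.
import numpy as np

def eigenvector_count(A, lam):
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

A_defective = np.array([[2.0, 1.0],
                        [0.0, 2.0]])   # (lam - 2)^2, but only one eigenvector
A_diagonal  = np.array([[3.0, 0.0],
                        [0.0, 3.0]])   # (lam - 3)^2, two eigenvectors

assert eigenvector_count(A_defective, 2.0) == 1
assert eigenvector_count(A_diagonal, 3.0) == 2
```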
If there are fewer than $k$ eigenvectors, the eigenvectors will not give us a complete basis of solutions. To complete the basis for our solution space, we must find generalized eigenvectors.
Let $\mathbf{v}$ be an eigenvector for $\lambda$, so $(A - \lambda I)\mathbf{v} = \mathbf{0}$. A generalized eigenvector is a vector $\mathbf{w}$ that solves the linear system
$$(A - \lambda I)\mathbf{w} = \mathbf{v}.$$
Once we have $\mathbf{v}$ and $\mathbf{w}$, we obtain two independent solutions
$$\mathbf{x}_1(t) = e^{\lambda t}\mathbf{v}, \qquad \mathbf{x}_2(t) = e^{\lambda t}(t\mathbf{v} + \mathbf{w}).$$
Then the general solution contributed by this repeated eigenvalue is
$$\mathbf{x}(t) = c_1 e^{\lambda t}\mathbf{v} + c_2 e^{\lambda t}(t\mathbf{v} + \mathbf{w}).$$
If we still need more eigenvectors, we can continue the process by solving $(A - \lambda I)\mathbf{w}_2 = \mathbf{w}$, and so on. Each new generalized eigenvector produces another solution with a higher power of $t$ multiplying $e^{\lambda t}$.
In general, if $\lambda$ has algebraic multiplicity $k$ and we can build generalized eigenvectors $\mathbf{w}_1, \dots, \mathbf{w}_{k-1}$ with the recurrence relation
$$(A - \lambda I)\mathbf{w}_1 = \mathbf{v}, \qquad (A - \lambda I)\mathbf{w}_j = \mathbf{w}_{j-1} \quad (j = 2, \dots, k-1),$$
then we obtain $k$ independent solutions
$$\mathbf{x}_1 = e^{\lambda t}\mathbf{v}, \quad \mathbf{x}_2 = e^{\lambda t}(t\mathbf{v} + \mathbf{w}_1), \quad \mathbf{x}_3 = e^{\lambda t}\!\left(\tfrac{t^2}{2}\mathbf{v} + t\mathbf{w}_1 + \mathbf{w}_2\right), \quad \dots$$
We can think of this as an analogy to the exponential response formula: repeated eigenvalues play the role of repeated characteristic roots, with extra factors of $t$ appearing in the solutions.
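A minimal numerical sketch of the generalized-eigenvector construction, assuming a sample defective $2 \times 2$ matrix:

```python
# Solve (A - lam I) w = v for a generalized eigenvector and check that
# x2(t) = e^{lam t}(t v + w) really solves x' = A x.
# The matrix A is an assumed defective example with eigenvalue 2.
import numpy as np

A   = np.array([[2.0, 1.0],
                [0.0, 2.0]])
lam = 2.0
v   = np.array([1.0, 0.0])            # ordinary eigenvector: (A - 2I) v = 0
B   = A - lam * np.eye(2)

# (A - lam I) w = v is singular but consistent; lstsq picks one solution.
w, *_ = np.linalg.lstsq(B, v, rcond=None)
assert np.allclose(B @ w, v)

for t in np.linspace(0.0, 1.0, 5):
    x2  = np.exp(lam * t) * (t * v + w)
    dx2 = lam * x2 + np.exp(lam * t) * v      # product rule
    assert np.allclose(dx2, A @ x2)
```

Since the system is singular, `lstsq` returns one particular solution (the minimum-norm one); any other choice of $\mathbf{w}$ differs by a multiple of $\mathbf{v}$ and works equally well.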
Complex Eigenvalues
For real matrices $A$, nonreal eigenvalues occur in conjugate pairs $\lambda = a + bi$ and $\bar{\lambda} = a - bi$ with $b \neq 0$. If $\mathbf{v}$ is an eigenvector for $\lambda$, then the complex solution
$$\mathbf{z}(t) = e^{\lambda t}\mathbf{v}$$
has real and imaginary parts that are real solutions. Using Euler's formula $e^{(a+bi)t} = e^{at}(\cos bt + i \sin bt)$, we obtain the real solution pair
$$\mathbf{x}_1(t) = \operatorname{Re}\!\left(e^{\lambda t}\mathbf{v}\right), \qquad \mathbf{x}_2(t) = \operatorname{Im}\!\left(e^{\lambda t}\mathbf{v}\right).$$
This same construction generalizes directly to $n$ dimensions: each complex conjugate eigenvalue pair produces two linearly independent real solutions obtained from the real and imaginary parts of the corresponding complex solution.
If $\lambda, \bar{\lambda}$ is a single complex conjugate pair, then the general (real) solution contributed by this pair is
$$\mathbf{x}(t) = c_1 \operatorname{Re}\!\left(e^{\lambda t}\mathbf{v}\right) + c_2 \operatorname{Im}\!\left(e^{\lambda t}\mathbf{v}\right),$$
with real constants $c_1, c_2$.
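The real/imaginary-part construction can be checked numerically; the matrix below (with eigenvalues $1 \pm i$) is an illustrative assumption:

```python
# Check that Re(e^{lam t} v) and Im(e^{lam t} v) are each real solutions
# of x' = A x.  The matrix A is an assumed example with eigenvalues 1 +/- i.
import numpy as np

A = np.array([[1.0, -1.0],
              [1.0,  1.0]])
lam_all, V = np.linalg.eig(A)
k   = np.argmax(lam_all.imag)      # pick the eigenvalue with positive imag part
lam = lam_all[k]
v   = V[:, k]

for t in np.linspace(0.0, 2.0, 7):
    z  = np.exp(lam * t) * v       # complex solution e^{lam t} v
    dz = lam * z                   # its derivative
    # Because A is real, taking Re/Im commutes with applying A.
    for part in (np.real, np.imag):
        assert np.allclose(part(dz), A @ part(z))
```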
Examples
Distinct Real Eigenvalues
Consider the initial value problem
$$\dot{\mathbf{x}} = \begin{pmatrix} 1 & 1 \\ 0 & 2 \end{pmatrix}\mathbf{x}, \qquad \mathbf{x}(0) = \begin{pmatrix} 2 \\ 1 \end{pmatrix}.$$
Since $A$ is upper triangular, its eigenvalues are the diagonal entries $\lambda_1 = 1$ and $\lambda_2 = 2$.
For $\lambda_1 = 1$, we solve $(A - I)\mathbf{v} = \mathbf{0}$:
$$\begin{pmatrix} 0 & 1 \\ 0 & 1 \end{pmatrix}\mathbf{v} = \mathbf{0} \quad\Longrightarrow\quad \mathbf{v}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$
For $\lambda_2 = 2$, we solve $(A - 2I)\mathbf{v} = \mathbf{0}$:
$$\begin{pmatrix} -1 & 1 \\ 0 & 0 \end{pmatrix}\mathbf{v} = \mathbf{0} \quad\Longrightarrow\quad \mathbf{v}_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Thus the general solution is
$$\mathbf{x}(t) = c_1 e^{t}\begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{2t}\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Applying the initial condition and solving for $c_1$ and $c_2$,
$$c_1 + c_2 = 2, \qquad c_2 = 1,$$
gives $c_1 = 1$ and $c_2 = 1$, so
$$\mathbf{x}(t) = e^{t}\begin{pmatrix} 1 \\ 0 \end{pmatrix} + e^{2t}\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Repeated Eigenvalues with Generalized Eigenvector
Consider the homogeneous system
$$\dot{\mathbf{x}} = \begin{pmatrix} 1 & 1 \\ -1 & 3 \end{pmatrix}\mathbf{x}.$$
The characteristic polynomial is $\lambda^2 - 4\lambda + 4 = (\lambda - 2)^2$, so $A$ has a repeated eigenvalue $\lambda = 2$.
Solving $(A - 2I)\mathbf{v} = \mathbf{0}$ gives
$$\begin{pmatrix} -1 & 1 \\ -1 & 1 \end{pmatrix}\mathbf{v} = \mathbf{0} \quad\Longrightarrow\quad \mathbf{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Note that as there is 1 pivot, we have $2 - 1 = 1$ linearly independent eigenvector.
To obtain a second independent solution, we find a generalized eigenvector $\mathbf{w}$ by solving the linear system
$$(A - 2I)\mathbf{w} = \mathbf{v}.$$
Here
$$A - 2I = \begin{pmatrix} -1 & 1 \\ -1 & 1 \end{pmatrix},$$
so $\mathbf{w}$ must satisfy
$$-w_1 + w_2 = 1.$$
From the first row we get $w_2 = 1 + w_1$, and $w_1$ is free, so one convenient choice is
$$\mathbf{w} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$
Then two independent solutions are
$$\mathbf{x}_1(t) = e^{2t}\begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad \mathbf{x}_2(t) = e^{2t}\left(t\begin{pmatrix} 1 \\ 1 \end{pmatrix} + \begin{pmatrix} 0 \\ 1 \end{pmatrix}\right),$$
and therefore the general solution is
$$\mathbf{x}(t) = c_1 e^{2t}\begin{pmatrix} 1 \\ 1 \end{pmatrix} + c_2 e^{2t}\left(t\begin{pmatrix} 1 \\ 1 \end{pmatrix} + \begin{pmatrix} 0 \\ 1 \end{pmatrix}\right).$$
Repeated Eigenvalues with Two Eigenvectors
Consider
$$\dot{\mathbf{x}} = \begin{pmatrix} 3 & 0 \\ 0 & 3 \end{pmatrix}\mathbf{x}.$$
Here $\lambda = 3$ has multiplicity $2$, and every nonzero vector is an eigenvector. In particular, two linearly independent eigenvectors are
$$\mathbf{v}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \qquad \mathbf{v}_2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$
Thus two independent solutions are $e^{3t}\mathbf{v}_1$ and $e^{3t}\mathbf{v}_2$, and the general solution is
$$\mathbf{x}(t) = c_1 e^{3t}\begin{pmatrix} 1 \\ 0 \end{pmatrix} + c_2 e^{3t}\begin{pmatrix} 0 \\ 1 \end{pmatrix} = e^{3t}\begin{pmatrix} c_1 \\ c_2 \end{pmatrix}.$$
Complex Eigenvalues
Consider
$$\dot{\mathbf{x}} = \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\mathbf{x}.$$
The eigenvalues are $\lambda = 1 \pm i$ (a complex conjugate pair). Using the formulas from the complex-eigenvalue case, with eigenvector $\mathbf{v} = \begin{pmatrix} 1 \\ -i \end{pmatrix}$ for $\lambda = 1 + i$, we obtain two real solutions
$$\mathbf{x}_1(t) = e^{t}\begin{pmatrix} \cos t \\ \sin t \end{pmatrix}, \qquad \mathbf{x}_2(t) = e^{t}\begin{pmatrix} \sin t \\ -\cos t \end{pmatrix},$$
and therefore the general solution is
$$\mathbf{x}(t) = c_1 e^{t}\begin{pmatrix} \cos t \\ \sin t \end{pmatrix} + c_2 e^{t}\begin{pmatrix} \sin t \\ -\cos t \end{pmatrix}.$$